What is peek-stream?
The peek-stream npm package is a Node.js module that lets you peek at the beginning of a stream and decide how to process it based on that initial data. This is useful when you need to inspect the start of a stream, for example to pick a parser or decoder, without consuming the stream up front.
What are peek-stream's main functionalities?
Peeking at stream content
This code sample demonstrates how to use peek-stream to inspect up to the first 10 bytes of a file stream. If the data starts with 'PNG', it swaps in a decoder stream (the PngDecoderStream below is a placeholder for a real decoder transform); otherwise, calling swap() with no arguments continues with the original data unchanged.
const peek = require('peek-stream');
const fs = require('fs');
const { PassThrough } = require('stream');

// Placeholder for a real PNG decoder transform stream
const PngDecoderStream = PassThrough;

const stream = fs.createReadStream('example.txt');
const peekStream = peek({ maxBuffer: 10 }, function (data, swap) {
  // data is at most the first 10 bytes of the stream
  if (data.toString().startsWith('PNG')) {
    // swap in a decoder for PNG content
    swap(null, new PngDecoderStream());
  } else {
    // no arguments: keep the data flowing unchanged
    swap();
  }
});
stream.pipe(peekStream).pipe(process.stdout);
Other packages similar to peek-stream
through2
Through2 is a tiny wrapper around Node.js's stream.Transform class that makes it easier to create transform streams. It is similar to peek-stream in that it lets you manipulate stream data, but it does not provide a way to peek at the stream before deciding how to process it.
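For contrast, here is a minimal through2 sketch that upper-cases each chunk; any peek-and-swap logic would have to be written by hand on top of this:
var through2 = require('through2')

// Upper-case every chunk that passes through
var upper = through2(function (chunk, enc, cb) {
  cb(null, chunk.toString().toUpperCase())
})

process.stdin.pipe(upper).pipe(process.stdout)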
stream-spigot
Stream-spigot is a module for creating readable streams that emit data from a given array or function. While it can be used to create streams with predetermined data, unlike peek-stream, it does not offer the ability to peek into an existing stream to make processing decisions.
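A minimal sketch of that idea, assuming stream-spigot's array helper (spigot.array), which emits each array element as a chunk:
var spigot = require('stream-spigot')

// Emit two predetermined chunks, then end
spigot.array(['beep\n', 'boop\n']).pipe(process.stdout)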
pumpify
Pumpify combines an array of streams into a single duplex stream using pump and duplexify. It is useful for streamlining the process of piping streams together but does not have the peeking capability that peek-stream provides.
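For example, a gunzip-then-line-split pipeline can be exposed as a single duplex stream (a sketch assuming the split2 package for the line-splitting step):
var pumpify = require('pumpify')
var zlib = require('zlib')
var split = require('split2')

// Write gzipped data in, read decompressed lines out
var pipeline = pumpify(zlib.createGunzip(), split())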
peek-stream
Transform stream that lets you peek the first line before deciding how to parse it
npm install peek-stream
Usage
var peek = require('peek-stream')
var ldjson = require('ldjson-stream')
var csv = require('csv-parser')

var isCSV = function (data) {
  return data.toString().indexOf(',') > -1
}

var isJSON = function (data) {
  try {
    JSON.parse(data)
    return true
  } catch (err) {
    return false
  }
}

var parser = function () {
  return peek(function (data, swap) {
    // data is the first line of the stream
    if (isJSON(data)) return swap(null, ldjson())
    if (isCSV(data)) return swap(null, csv())
    swap(new Error('No parser available'))
  })
}
The above parser is able to parse both line-delimited JSON and CSV:
var parse = parser()

parse.write('{"hello":"world"}\n{"hello":"another"}\n')
parse.on('data', function (data) {
  console.log(data)
})
Or
var parse = parser()

parse.write('test,header\nvalue-1,value-2\n')
parse.on('data', function (data) {
  console.log(data)
})
By default data is the first line (or the first 65535 bytes if no newline is found).
To change the max buffer, pass an options map to the constructor:
var parse = peek({
  maxBuffer: 10000
}, function (data, swap) {
  ...
})
If you want to emit an error if no newline is found, set strict: true as well.
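A minimal sketch combining both options; if no newline appears within the first maxBuffer bytes, the stream emits an error instead of calling the peek callback:
var parse = peek({
  maxBuffer: 10000,
  strict: true
}, function (data, swap) {
  if (isJSON(data)) return swap(null, ldjson())
  swap(new Error('No parser available'))
})

parse.on('error', function (err) {
  console.error(err.message)
})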
License
MIT